Translation and Dictionary
Words near each other
・ Vowel
・ Vowel breaking
・ Vowel diagram
・ Vowel dimension
・ Vowel harmony
・ Vowel length
・ Vowel pointing
・ Vowel reduction
・ Vowel reduction in Russian
・ Vowel shift
・ Vowell
・ Vowel–consonant synthesis
・ Vowinckel
・ Vowinckel, Pennsylvania
・ Vowles
Vowpal Wabbit
・ VOWR
・ Vows (album)
・ Vows (band)
・ Vows (Dollhouse)
・ Vox
・ Vox (blogging platform)
・ Vox (journal)
・ Vox (magazine)
・ Vox (musical equipment)
・ VOX (Norwegian TV channel)
・ Vox (software)
・ Vox (song)
・ Vox (Spanish political party)
・ Vox (The Edge Chronicles)



Vowpal Wabbit : English Wikipedia
Vowpal Wabbit

Vowpal Wabbit (also known as "VW") is a fast, open-source, out-of-core machine learning system library and program, originally developed at Yahoo! Research and currently at Microsoft Research. The project was started by John Langford, who continues to lead it. Vowpal Wabbit is notable for its efficient, scalable implementation of online machine learning and for its support of a number of machine learning reductions, importance weighting, and a selection of different loss functions and optimization algorithms.
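To make the out-of-core, online workflow concrete, here is a minimal sketch that drives the vw command-line program from Python. It assumes a local vw binary on the PATH; the file names and features (a toy house-price regression) are invented for illustration, while the flags used (-d, -f, -t, -i, -p, --loss_function) are standard VW options.
<syntaxhighlight lang="python">
import subprocess

# Two training examples in VW's plain-text input format:
#   <label> [importance] ['tag] | <feature>[:<value>] <feature> ...
with open("house.train", "w") as f:
    f.write("0 | price:0.23 sqft:0.25 age:0.05 2006\n")
    f.write("1 2 'second_house | price:0.18 sqft:0.15 age:0.35 1976\n")

# Online training (single pass, out of core); -f saves the learned model.
subprocess.run(["vw", "-d", "house.train", "--loss_function", "squared",
                "-f", "house.model"], check=True)

# Test-only pass (-t) over unseen data, loading the model with -i and
# writing predictions with -p.
with open("house.test", "w") as f:
    f.write("| price:0.27 sqft:0.09 age:0.50 1998\n")

subprocess.run(["vw", "-d", "house.test", "-t", "-i", "house.model",
                "-p", "house.preds"], check=True)

print(open("house.preds").read())
</syntaxhighlight>
In this input format, price and sqft carry explicit numeric values, the bare token 2006 is a categorical feature handled through the hash trick, the leading 2 on the second line is an importance weight, and 'second_house is a tag.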
==Notable features==
The VW program supports (several of these options are combined in the sketches after this list):
* Multiple supervised (and semi-supervised) learning problems:
** Classification (both binary and multi-class)
** Regression
** Active learning (partially labeled data) for both regression and classification
* Multiple learning algorithms (model types / representations):
** OLS regression
** Matrix factorization (sparse-matrix SVD)
** Single-layer neural net (with user-specified hidden-layer node count)
** Searn (Search and Learn)
** Latent Dirichlet Allocation (LDA)
** Stagewise polynomial approximation
** Recommend top-K out of N
** One-against-all (OAA) and cost-sensitive OAA reduction for multi-class
** Weighted all pairs
** Contextual bandit
* Multiple loss functions:
** Squared error
** Quantile
** Hinge
** Logistic
* Multiple optimization algorithms:
** Stochastic gradient descent (SGD)
** BFGS
** Conjugate gradient
* Regularization (L1 norm, L2 norm, and elastic net regularization)
* Flexible input, where input features may be:
** Binary
** Numerical
** Categorical (via flexible feature naming and the hash trick)
** Missing values and sparse features are handled
* Other features:
** On-the-fly generation of feature interactions (quadratic and cubic)
** On-the-fly generation of N-grams with optional skips (useful for word/language data sets)
** Automatic test-set holdout and early termination when running multiple passes
** Bootstrapping
** User-settable online learning progress reports, plus auditing of the model
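As a rough illustration of how several of the listed options combine, the following sketch (again calling a local vw binary from Python) trains a three-class one-against-all model with logistic loss, on-the-fly quadratic interactions between two namespaces, bigram generation, L1 regularization, and multiple passes over a cache. The toy data set and the namespace letters t and m are invented for the example.
<syntaxhighlight lang="python">
import subprocess

# Toy three-class data; namespace "t" holds word tokens, "m" numeric metadata.
with open("toy.train", "w") as f:
    f.write("1 |t the quick brown fox |m len:4\n")
    f.write("2 |t jumped over the lazy dog |m len:5\n")
    f.write("3 |t the end |m len:2\n")

subprocess.run([
    "vw", "-d", "toy.train",
    "--oaa", "3",                   # one-against-all reduction to 3 classes
    "--loss_function", "logistic",  # logistic loss for the underlying binary problems
    "-q", "tm",                     # quadratic interactions between namespaces t and m
    "--ngram", "2",                 # generate bigrams on the fly
    "--l1", "1e-6",                 # L1 regularization
    "--passes", "5", "-c",          # multiple passes over an example cache
    "-f", "toy.model",
], check=True)
</syntaxhighlight>
With more than one pass, VW automatically holds out a fraction of the examples and stops early when holdout loss stops improving; the --holdout_off flag disables this behaviour.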
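The contextual-bandit reduction listed above uses its own label format, action:cost:probability, describing the action that was actually taken, its observed cost, and the probability with which the logging policy chose it. A hedged sketch with invented data, assuming four possible actions:
<syntaxhighlight lang="python">
import subprocess

# Each line: chosen action, its cost, and the logging probability,
# followed by the context features.
with open("cb.train", "w") as f:
    f.write("1:2.0:0.4 | a c\n")
    f.write("3:0.5:0.2 | b d\n")
    f.write("4:1.2:0.5 | a b c\n")
    f.write("2:1.0:0.3 | b c\n")

# --cb <k> trains a contextual-bandit policy over k actions.
subprocess.run(["vw", "-d", "cb.train", "--cb", "4", "-f", "cb.model"],
               check=True)
</syntaxhighlight>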

Excerpt source: Wikipedia, the free encyclopedia
Read the full article on "Vowpal Wabbit" at Wikipedia



Translation and Dictionary : Internet resources for translation

Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.